I have configured Alertmanager to send events to PagerDuty.
The configuration almost works: when an alert fires in Alertmanager (AM), the event is created in PagerDuty (PD), but the severity is not propagated correctly.
I tried templating the value from .CommonLabels.severity, and I also tried setting a fixed value manually in the configuration.
In both cases the event in PD always ends up with severity critical and urgency low.
Here is the AM configuration:
apiVersion: monitoring.coreos.com/v1alpha1
kind: AlertmanagerConfig
metadata:
  name: istio-alert-manager-config
  namespace: istio-system
  labels:
    alertmanager: config
spec:
  route:
    groupBy: [application, alertname]
    groupWait: 15s
    groupInterval: 5m
    repeatInterval: 15h
    receiver: service-mesh-platform-istio
    routes:
    - continue: true
      matchers:
      - name: application
        value: istio
        regex: false
      - name: severity
        value: critical
        regex: false
      receiver: service-mesh-platform-istio
    - continue: true
      matchers:
      - name: application
        value: istio
        regex: false
      receiver: service-mesh-platform-istio
  receivers:
  - name: service-mesh-platform-istio
    pagerdutyConfigs:
    - sendResolved: true
      serviceKey:
        name: service-mesh-pagerduty-key
        key: serviceKey
      description: "Alert {{`{{ .CommonLabels.alertname }}`}}"
      severity: 'warning'
      details:
      - key: num_firing
        value: '{{ "{{ .Alerts.Firing | len }}" }}'
      - key: num_resolved
        value: '{{ "{{ .Alerts.Resolved | len }}" }}'
      - key: description
        value: '{{ "{{ range .Alerts -}}" }} - {{ "{{ .Annotations.summary }}" }}{{ "{{end}}" }}'
      - key: start_time
        value: '{{ "{{ range .Alerts -}}" }} - {{ "{{ .StartsAt }}" }}{{ "{{end}}" }}'
      - key: severity
        value: '{{ "{{ (index .Alerts 0).Labels.severity }}" }}'
      - key: severity_tmp
        value: '{{`{{if .CommonLabels.severity}}`}}{{`{{ .CommonLabels.severity | toLower }}`}}{{`{{ else }}`}}other{{`{{ end }}`}}'
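Since this manifest is templated by Helm (hence the {{` ... `}} and {{ "..." }} wrappers), my assumption is that Helm strips one layer of templating and the values Alertmanager actually receives should look roughly like this:

# expected values after Helm rendering (my assumption of how the escaping collapses)
description: 'Alert {{ .CommonLabels.alertname }}'
details:
- key: severity
  value: '{{ (index .Alerts 0).Labels.severity }}'
- key: severity_tmp
  value: '{{if .CommonLabels.severity}}{{ .CommonLabels.severity | toLower }}{{ else }}other{{ end }}'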
I also tried setting the severity field like this:
severity: '{{`{{if .CommonLabels.severity}}`}}{{`{{ .CommonLabels.severity | toLower }}`}}{{`{{ else }}`}}critical{{`{{ end }}`}}'
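My assumption is that, once Helm renders it, this collapses to plain Alertmanager template syntax:

# expected severity value as Alertmanager should see it (assumption)
severity: '{{if .CommonLabels.severity}}{{ .CommonLabels.severity | toLower }}{{ else }}critical{{ end }}'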
Thanks in advance